- linearly independent random variables
- линейно независимые случайные величины
English-Russian scientific dictionary. 2008.
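Not part of the original entry: a minimal sketch of the concept named by the headword, in the usual reading where the random variables are treated as vectors in the space of random variables on a common probability space (the symbols X_1, …, X_n and the scalars c_1, …, c_n below are illustrative):

  % Linearly independent random variables: the only linear combination
  % that vanishes almost surely is the trivial one.
  \[
    c_1 X_1 + c_2 X_2 + \dots + c_n X_n = 0 \quad \text{a.s.}
    \qquad \Longrightarrow \qquad
    c_1 = c_2 = \dots = c_n = 0 .
  \]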
See also in other dictionaries:

Linear regression — In statistics, linear regression is an approach to modeling the relationship between a scalar variable y and one or more explanatory variables denoted X. … Wikipedia
Probability distribution — In probability theory, a probability mass, probability density … Wikipedia
Brownian motion — Brownian motion (named after the botanist Robert Brown) … Wikipedia
Ordinary least squares — unweighted linear regression analysis and its statistical properties. … Wikipedia
Linear independence — In linear algebra, a family of vectors is linearly independent if none of them can be written as a linear combination of finitely many other vectors in the collection. A family of vectors which is not linearly independent is called linearly dependent. … Wikipedia
Matrix (mathematics) — In mathematics, a matrix (plural matrices, or less commonly matrixes) is a rectangular array of numbers, symbols, or expressions arranged in rows and columns; specific elements of a matrix are often denoted by a variable with two subscripts, for instance a2,1 represents the element at the second row and first column of a matrix A. … Wikipedia
Eigenvalue, eigenvector and eigenspace — In mathematics, given a linear transformation, an eigenvector of that linear transformation is a nonzero vector which, when that transformation is applied to it, changes in length, but not direction. … Wikipedia
Regression analysis — In statistics, regression analysis is a collective name for techniques for the modeling and analysis of numerical data consisting of values of a dependent variable (response variable) and of one or more independent variables (explanatory variables). … Wikipedia
Cochran's theorem — In statistics, Cochran's theorem, devised by William G. Cochran, is a theorem used to justify results relating to the probability distributions of statistics that are used in the analysis of variance. … Wikipedia
Linear least squares (mathematics) — the mathematics that underlie curve fitting using linear least squares. … Wikipedia
Rotation matrix — In linear algebra, a rotation matrix is a matrix that is used to perform a rotation in Euclidean space. For example, the matrix with rows (cos θ, −sin θ) and (sin θ, cos θ) rotates points in the xy Cartesian plane counterclockwise through an angle θ about the origin. … Wikipedia